On the approximation by single hidden layer feedforward neural networks with fixed weights
Authors
Abstract
Single hidden layer feedforward neural networks (SLFNs) with fixed weights possess the universal approximation property provided that the approximated functions are univariate. However, this result places no restriction on the number of neurons in the hidden layer: the larger this number, the more likely the network is to give precise results. In this note, we constructively prove that SLFNs with the fixed weight 1 and only two neurons in the hidden layer can approximate any continuous function on a compact subset of the real line. The proof proceeds by a step-by-step construction of a universal sigmoidal activation function. This function has desirable properties such as computability, smoothness, and weak monotonicity. The applicability of the result is demonstrated in various numerical examples. Finally, we show that SLFNs with fixed weights cannot approximate all continuous multivariate functions.
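The network form described in the abstract can be sketched as follows. This is a minimal illustration of the architecture only: the inner weights are fixed to 1, and the two hidden-neuron outputs are linearly combined. A standard logistic function is used here as a stand-in activation; the paper's universal approximation claim relies on a specially constructed sigmoid, not on this particular choice.

```python
import math

def logistic(t):
    # Standard logistic sigmoid, used here only as a placeholder;
    # the paper constructs a special computable, smooth, weakly
    # monotone sigmoid for which the two-neuron result holds.
    return 1.0 / (1.0 + math.exp(-t))

def slfn_two_neurons(x, c1, c2, theta1, theta2, sigma=logistic):
    # SLFN with fixed inner weight 1 and two hidden neurons:
    #   N(x) = c1 * sigma(x - theta1) + c2 * sigma(x - theta2)
    # Only c1, c2 and the thresholds theta1, theta2 vary.
    return c1 * sigma(x - theta1) + c2 * sigma(x - theta2)
```

With the logistic stand-in, `slfn_two_neurons(0.0, 2.0, -1.0, 0.0, 0.0)` evaluates to `2*0.5 - 1*0.5 = 0.5`, showing how the two hidden activations combine.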
Similar references
Existence and uniqueness results for neural network approximations
Some approximation theoretic questions concerning a certain class of neural networks are considered. The networks considered are single input, single output, single hidden layer, feedforward neural networks with continuous sigmoidal activation functions, no input weights but with hidden layer thresholds and output layer weights. Specifically, questions of existence and uniqueness of best approx...
Efficient Higher-Order Neural Networks for Classification and Function Approximation
This paper introduces a class of higher-order networks called pi-sigma networks (PSNs). PSNs are feedforward networks with a single "hidden" layer of linear summing units, and with product units in the output layer. A PSN uses these product units to indirectly incorporate the capabilities of higher-order networks while greatly reducing network complexity. PSNs have only one layer of adjustable ...
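The pi-sigma structure described above can be sketched in a few lines: a hidden layer of linear summing units whose outputs are multiplied together, yielding a polynomial of degree equal to the number of summing units. This is a hypothetical minimal sketch of the architecture, not the authors' implementation.

```python
def pi_sigma(x, weights, biases):
    # Pi-sigma network with one output: K linear summing units
    # s_j(x) = w_j . x + b_j, combined by a product unit:
    #   y(x) = prod_j s_j(x)
    # (A nonlinearity may be applied to the product; omitted here.)
    out = 1.0
    for w, b in zip(weights, biases):
        out *= sum(wi * xi for wi, xi in zip(w, x)) + b
    return out
```

For example, with two summing units that each pick out one input coordinate, `pi_sigma([1.0, 2.0], [[1.0, 0.0], [0.0, 1.0]], [0.0, 0.0])` returns the product `1.0 * 2.0 = 2.0`.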
Approximation by Superpositions of a Sigmoidal Function
Abstract. In this paper we demonstrate that finite linear combinations of compositions of a fixed, univariate function and a set of affine functionals can uniformly approximate any continuous function of n real variables with support in the unit hypercube; only mild conditions are imposed on the univariate function. Our results settle an open question about representability in the class of sing...
Weighted Least Squares Scheme for Reducing Effects of Outliers in Regression based on Extreme Learning Machine
Neural networks have been massively used in regression problems due to their ability to approximate complex nonlinear mappings directly from input patterns. However, collected data for training networks often include outliers which affect final results. This paper presents an approach for training single hidden-layer feedforward neural networks (SLFNs) using weighted least-squares scheme which ...
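The weighted least-squares idea mentioned in this blurb can be sketched as follows. Assuming `H` is the hidden-layer output matrix of an SLFN and `y` the targets, the output weights `beta` minimize the weighted squared error with per-sample weights `w` that down-weight suspected outliers. The function name and setup are illustrative, not taken from the paper.

```python
import numpy as np

def weighted_ls_output_weights(H, y, w):
    # Solve the normal equations of weighted least squares:
    #   beta = argmin sum_i w[i] * (H[i] @ beta - y[i])**2
    #        = (H^T W H)^{-1} H^T W y,  W = diag(w)
    # Small w[i] reduces the influence of sample i (an outlier).
    W = np.diag(w)
    return np.linalg.solve(H.T @ W @ H, H.T @ W @ y)
```

With equal weights this reduces to ordinary least squares; a robust scheme would choose `w` from the residuals of an initial fit.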
Journal: Neural networks : the official journal of the International Neural Network Society
Volume: 98
Pages: -
Published: 2018